Search for: All records

Creators/Authors contains: "Buchanan, Austin"

  1. Political Districting to Minimize County Splits
    When dividing a state into districts for elections, one traditional criterion is that political subdivisions such as counties and cities should not be divided unnecessarily. Some states go so far as to require that the number of county splits be minimized, but previously there was no scalable exact method for determining this minimum. With new integer programming techniques, Shahmizad and Buchanan exactly compute this minimum number of splits for all states and district types (congressional, state senate, state house) across the USA.
    Free, publicly-accessible full text available March 1, 2026
  2. Consider the task of dividing a state into k contiguous political districts whose populations must not differ by more than one person, following current practice for congressional districting in the USA. A widely held belief among districting experts is that this task requires at least k − 1 county splits. This statement has appeared in expert testimony, special master reports, and Supreme Court oral arguments. In this article, we seek to dispel this belief. To illustrate, we find plans for several states that use zero county splits, that is, all counties are kept whole, while still satisfying contiguity and the one-person deviation limit. This is not a rare phenomenon; states like Iowa and Montana admit hundreds, thousands, or tens of thousands of such plans. In practice, mapmakers may need to satisfy additional criteria, like compactness, minority representation, and partisan fairness, which may lead them to believe that k − 1 splits is the minimum. Again, this need not be true. To illustrate, we conduct short case studies for North Carolina (for partisan fairness) and Alabama (for minority representation). Contrary to expert testimony and Supreme Court oral arguments from Allen v. Milligan (2023), we find that fewer than k − 1 county splits suffice, even under these additional criteria. This demonstrates our narrow point that k − 1 county splits should not be assumed to be the minimum, and it also suggests that districting criteria do not conflict as much as people sometimes believe. The optimization methods proposed in this article are flexible and can assist mapmakers in satisfying such criteria.
    Free, publicly-accessible full text available February 3, 2026
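
A quick way to see what the zero-split claim above requires is to check a whole-county plan directly: every district must be a contiguous block of counties, and district populations may differ by at most one person. The following is a minimal sketch of that check, not the authors' code; the counties, populations, adjacencies, and plan are hypothetical toy data.

from collections import deque

# Hypothetical county data: populations and neighboring counties.
population = {"A": 100, "B": 101, "C": 100, "D": 99, "E": 100, "F": 100}
adjacency = {
    "A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B", "E"},
    "D": {"B", "F"}, "E": {"C", "F"}, "F": {"D", "E"},
}

# A candidate plan assigning whole counties to k = 3 districts (zero splits).
plan = {1: {"A", "C"}, 2: {"B", "D"}, 3: {"E", "F"}}

def is_contiguous(counties):
    """BFS inside one district to confirm its counties form a single connected block."""
    counties = set(counties)
    start = next(iter(counties))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adjacency[u] & counties:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen == counties

def is_feasible(plan):
    totals = [sum(population[c] for c in district) for district in plan.values()]
    balanced = max(totals) - min(totals) <= 1            # one-person deviation rule
    return balanced and all(is_contiguous(d) for d in plan.values())

print(is_feasible(plan))  # True for this toy instance

The article's contribution is, of course, the optimization side (finding such plans, or proving none exist, under additional criteria); the sketch above only verifies a given plan.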
  3. Cliques and their generalizations are frequently used to model “tightly knit” clusters in graphs, and identifying such clusters is a popular technique used in graph-based data mining. One such model is the s-club, which is a vertex subset that induces a subgraph of diameter at most s. This model has found use in a variety of fields because low-diameter clusters have practical significance in many applications. As this property is not hereditary on vertex-induced subgraphs, the diameter of a subgraph could increase upon the removal of some vertices, and the subgraph could even become disconnected. For example, star graphs have diameter two but can be disconnected by removing the central vertex. The pursuit of a fault-tolerant extension of the s-club model has spawned two variants that we study in this article: robust s-clubs and hereditary s-clubs. We analyze the complexity of the verification and optimization problems associated with these variants. Then, we propose cut-like integer programming formulations for both variants whenever possible and investigate the separation complexity of the cut-like constraints. We demonstrate through our extensive computational experiments that the algorithmic ideas we introduce enable us to solve the problems to optimality on benchmark instances with several thousand vertices. This work lays the foundations for effective mathematical programming approaches for finding fault-tolerant s-clubs in large-scale networks.
    History: Accepted by David Alderson, Area Editor for Network Optimization: Algorithms & Applications.
    Funding: The computing for this project was performed at the High Performance Computing Center at Oklahoma State University, supported in part through the National Science Foundation [Grant OAC-1531128]. This material is based upon work supported by the National Science Foundation under [Grants 1662757 and 1942065].
    Supplemental Material: The online appendix is available at https://doi.org/10.1287/ijoc.2022.1231.
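
The s-club definition quoted above (a vertex subset that induces a subgraph of diameter at most s) and the star-graph example are easy to check directly. Below is a small illustrative verifier, not the paper's code, using networkx (assumed available).

import networkx as nx

def is_s_club(G, S, s):
    """True if the subgraph of G induced by S is connected and has diameter at most s."""
    H = G.subgraph(S)
    return nx.is_connected(H) and nx.diameter(H) <= s

G = nx.star_graph(5)                     # center 0 joined to leaves 1..5; diameter 2
print(is_s_club(G, list(G.nodes), 2))    # True: the whole star is a 2-club
print(is_s_club(G, [1, 2, 3, 4, 5], 2))  # False: deleting the center disconnects the leaves

The second call shows why the property is not hereditary on induced subgraphs, which is exactly what motivates the robust and hereditary variants studied in the article.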
  4. Beginning in the 1960s, techniques from operations research have been used to generate political districting plans. A classical example is the integer programming model of Hess et al. [Hess SW, Weaver JB, Siegfeldt HJ, Whelan JN, Zitlau PA (1965) Oper. Res. 13(6):998–1006.]. Because of the model’s compactness-seeking objective, it tends to generate contiguous or nearly contiguous districts, although none of the model’s constraints explicitly impose contiguity. Consequently, Hess et al. had to manually adjust their solutions to make them contiguous. Since then, there have been several attempts to adjust the Hess model and other models so that contiguity is explicitly ensured. In this paper, we review two existing models for imposing contiguity, propose two new ones, and analytically compare them in terms of their strength and size. We conduct an extensive set of numerical experiments to evaluate their performance. Although many believe that contiguity constraints are particularly difficult to deal with, we find that the districting problem considered by Hess et al. does not become harder when contiguity is imposed. In fact, a branch-and-cut implementation of a cut-based model generates, for the first time, optimally compact districting plans for 21 different U.S. states at the census tract level. To encourage future research in this area, and for purposes of transparency, we make our test instances and source code publicly available.
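
As a rough illustration of how contiguity can be imposed with cut-like constraints in Hess-style assignment models (a generic sketch, not necessarily one of the formulations compared in the paper): if unit i is assigned to the district centered at j, then every i,j-separator C must contain some unit also assigned to j, i.e. x[i,j] <= sum of x[c,j] over c in C. The toy grid, assignment, and separation routine below are hypothetical.

from collections import deque

# Hypothetical 2x3 grid of units with grid adjacency:
#   0 - 1 - 2
#   |   |   |
#   3 - 4 - 5
adj = {0: {1, 3}, 1: {0, 2, 4}, 2: {1, 5}, 3: {0, 4}, 4: {1, 3, 5}, 5: {2, 4}}

# Toy assignment: the district centered at 0 gets {0, 2}, which is NOT contiguous;
# the district centered at 4 gets {1, 3, 4, 5}.
assign = {0: 0, 2: 0, 1: 4, 3: 4, 4: 4, 5: 4}

def violated_cut(i, j):
    """If unit i cannot reach its center j within district j, return a separator C
    whose cut inequality x[i,j] <= sum_{c in C} x[c,j] is violated by this assignment."""
    district = {v for v, center in assign.items() if center == j}
    comp, queue = {i}, deque([i])        # BFS from i restricted to the district
    while queue:
        u = queue.popleft()
        for w in adj[u] & district:
            if w not in comp:
                comp.add(w)
                queue.append(w)
    if j in comp:
        return None                      # i already reaches j inside the district
    # The neighbors of i's component separate i from j, and none of them is
    # assigned to j, so the inequality reads 1 <= 0: violated.
    return set().union(*(adj[u] for u in comp)) - comp

print(violated_cut(2, 0))  # {1, 5}: a cut that a branch-and-cut method could add

In a branch-and-cut implementation, an inequality like the one found here would be added to the relaxation, forbidding assignments that strand unit 2 from its center.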
  5. Two nodes of a wireless network may not be able to communicate with each other directly, perhaps because of obstacles or insufficient signal strength. This necessitates the use of intermediate nodes to relay information. Often, one designates a (preferably small) subset of them to relay these messages (i.e., to serve as a virtual backbone for the wireless network), which can be seen as a connected dominating set (CDS) of the associated graph. Ideally, these communication paths should be short, leading to the notion of a latency-constrained CDS. In this paper, we point out several shortcomings of a previously studied formalization of a latency-constrained CDS and propose an alternative one. We introduce an integer programming formulation for the problem that has a variable for each node and imposes the latency constraints via an exponential number of cut-like inequalities. Two nice properties of this formulation are that (1) it applies both when distances are hop-based and when they are weighted, and (2) it easily generalizes to ensure fault tolerance. We provide a branch-and-cut implementation of this formulation and compare it with a new polynomial-size formulation. Computational experiments demonstrate the superiority of the cut-like formulation. We also study related questions from computational complexity, such as approximation hardness, and answer an open problem regarding the fault diameter of graphs.
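
To make the definitions above concrete, here is a hedged sketch (the paper's precise latency notion may differ) that checks that a relay set D is a connected dominating set and that any two nodes can communicate within a hop budget s while routing only through relays in D. It uses networkx and a toy network.

import networkx as nx
from itertools import combinations

def is_cds(G, D):
    """D induces a connected subgraph and dominates every vertex outside D."""
    H = G.subgraph(D)
    connected = len(D) > 0 and nx.is_connected(H)
    dominating = all(any(w in D for w in G[v]) for v in G if v not in D)
    return connected and dominating

def within_latency(G, D, s):
    """Every pair u, v can communicate in at most s hops using only relays in D."""
    for u, v in combinations(G.nodes, 2):
        H = G.subgraph(set(D) | {u, v})
        if not nx.has_path(H, u, v) or nx.shortest_path_length(H, u, v) > s:
            return False
    return True

G = nx.path_graph(5)            # toy network 0-1-2-3-4
D = [1, 2, 3]                   # interior nodes relay all traffic
print(is_cds(G, D))             # True
print(within_latency(G, D, 4))  # True: the longest route (0 to 4) takes 4 hops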
  6. To this day, the maximum clique problem remains a computationally challenging problem. Indeed, despite researchers’ best efforts, there exist unsolved benchmark instances with 1,000 vertices. However, relatively simple algorithms solve real-life instances with millions of vertices in a few seconds. Why is this the case? Why is the problem apparently so easy in many naturally occurring networks? In this paper, we provide an explanation. First, we observe that the graph’s clique number ω is very near to the graph’s degeneracy d in most real-life instances. This observation motivates a main contribution of this paper, which is an algorithm for the maximum clique problem that runs in time polynomial in the size of the graph, but exponential in the gap g = (d + 1) − ω between the clique number ω and its degeneracy-based upper bound d + 1. When this gap g can be treated as a constant, as is often the case for real-life graphs, the proposed algorithm runs in time O(dm) = O(m^1.5). This provides a rigorous explanation for the apparent easiness of these instances despite the intractability of the problem in the worst case. Further, our implementation of the proposed algorithm is actually practical: it is competitive with the best approaches from the literature.
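
The gap described above is easy to observe empirically. The sketch below is not the paper's algorithm; it simply computes the degeneracy d from core numbers, the clique number ω by enumerating maximal cliques (viable only for small graphs), and the gap g = (d + 1) − ω on a classic small network, with networkx assumed available.

import networkx as nx

def degeneracy(G):
    """Largest k such that G has a nonempty k-core; this equals the degeneracy d."""
    return max(nx.core_number(G).values())

def clique_number(G):
    """Exact clique number by enumerating maximal cliques (small graphs only)."""
    return max(len(c) for c in nx.find_cliques(G))

G = nx.karate_club_graph()            # a classic small real-life network
d = degeneracy(G)
omega = clique_number(G)
print(d, omega, (d + 1) - omega)      # here the degeneracy bound is tight: gap 0

A gap of zero on this graph mirrors the paper's observation that ω tends to sit very close to its degeneracy-based upper bound d + 1 in real-life instances.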